neural map
Active Neural Mapping at Scale
Zijia Kuang, Zike Yan, Hao Zhao, Guyue Zhou, Hongbin Zha
Abstract-- We introduce a NeRF-based active mapping system that enables efficient and robust exploration of large-scale indoor environments. The key to our approach is the extraction of a generalized Voronoi graph (GVG) from the continually updated neural map, leading to the synergistic integration of scene geometry, appearance, topology, and uncertainty. Anchoring uncertain areas induced by the neural map to the vertices of GVG allows the exploration to undergo adaptive granularity along a safe path that traverses unknown areas efficiently. Harnessing a modern hybrid NeRF representation, the proposed system achieves competitive results in terms of reconstruction accuracy, coverage completeness, and exploration efficiency even when scaling up to large indoor environments. Extensive results at different scales validate the efficacy of the proposed system.
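The generalized Voronoi graph at the heart of this system is the set of free-space points equidistant from their nearest distinct obstacles; anchoring goals to it keeps the robot on a maximally safe path. A minimal 2D sketch on a static occupancy grid (the paper instead extracts the graph from a continually updated neural map; `voronoi_ridge` and the half-cell tolerance are illustrative choices, not the paper's method):

```python
import math

def voronoi_ridge(grid):
    """grid: list of strings, '#' = obstacle, '.' = free space.
    Return the free cells (row, col) that are equidistant, within half
    a cell, from their two nearest distinct obstacle components."""
    h, w = len(grid), len(grid[0])
    obstacle = {(r, c) for r in range(h) for c in range(w) if grid[r][c] == '#'}
    # Label obstacle cells by 4-connected component.
    label, comp = {}, 0
    for cell in obstacle:
        if cell in label:
            continue
        comp += 1
        stack = [cell]
        while stack:
            r, c = stack.pop()
            if (r, c) in label:
                continue
            label[(r, c)] = comp
            stack.extend(nb for nb in ((r+1, c), (r-1, c), (r, c+1), (r, c-1))
                         if nb in obstacle and nb not in label)
    ridge = set()
    for r in range(h):
        for c in range(w):
            if grid[r][c] != '.':
                continue
            nearest = {}  # component id -> distance to its closest cell
            for (orow, ocol), k in label.items():
                d = math.hypot(r - orow, c - ocol)
                nearest[k] = min(nearest.get(k, math.inf), d)
            if len(nearest) < 2:
                continue
            d0, d1 = sorted(nearest.values())[:2]
            if d1 - d0 <= 0.5:
                ridge.add((r, c))
    return ridge

# A corridor between two walls: the ridge is the centre column,
# i.e. the safest path through the free space.
grid = ["#.....#",
        "#.....#",
        "#.....#"]
print(sorted(voronoi_ridge(grid)))  # [(0, 3), (1, 3), (2, 3)]
```

The brute-force distance scan keeps the sketch short; a real system would use a distance transform and thin the ridge into graph vertices and edges.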
Generating Synthetic Datasets by Interpolating along Generalized Geodesics
Jiaojiao Fan, David Alvarez-Melis
Data for pretraining machine learning models often consists of collections of heterogeneous datasets. Although training on their union is reasonable in agnostic settings, it might be suboptimal when the target domain -- where the model will ultimately be used -- is known in advance. In that case, one would ideally pretrain only on the dataset(s) most similar to the target one. Instead of limiting this choice to those datasets already present in the pretraining collection, here we explore extending this search to all datasets that can be synthesized as 'combinations' of them. We define such combinations as multi-dataset interpolations, formalized through the notion of generalized geodesics from optimal transport (OT) theory. We compute these geodesics using a recent notion of distance between labeled datasets, and derive alternative interpolation schemes based on it: using either barycentric projections or optimal transport maps, the latter computed using recent neural OT methods. These methods are scalable, efficient, and -- notably -- can be used to interpolate even between datasets with distinct and unrelated label sets. Through various experiments in transfer learning in computer vision, we demonstrate this is a promising new approach for targeted on-demand dataset synthesis.
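A generalized geodesic sends each point of a reference dataset to a weighted average of its images under the OT maps to the endpoint datasets. In one dimension with equal-size samples, the OT map is just the rank (sorted) matching, which gives a closed-form toy version of the idea (`generalized_geodesic_1d` is an illustrative name; the paper works with labeled, high-dimensional datasets and neural OT maps, none of which appear here):

```python
def generalized_geodesic_1d(reference, datasets, weights):
    """Interpolate several equal-size 1D samples along a generalized
    geodesic anchored at `reference`. `weights` must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    assert all(len(d) == len(reference) for d in datasets)
    # The 1D OT map is monotone: the k-th smallest reference point maps
    # to the k-th smallest point of each target dataset.
    order = sorted(range(len(reference)), key=lambda k: reference[k])
    sorted_sets = [sorted(d) for d in datasets]
    out = [0.0] * len(reference)
    for rank, k in enumerate(order):
        # Weighted average of the images T_i(reference[k]).
        out[k] = sum(w * s[rank] for w, s in zip(weights, sorted_sets))
    return out

# Halfway between two shifted samples: each reference point lands at the
# midpoint of its two images.
a = [0.0, 1.0, 2.0]
b = [10.0, 11.0, 12.0]
print(generalized_geodesic_1d([0.3, 0.1, 0.2], [a, b], [0.5, 0.5]))
# -> [7.0, 5.0, 6.0]
```

With two datasets and weights (1 - t, t) this reduces to ordinary displacement interpolation; adding more datasets and weights is what makes the geodesic "generalized".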
Neuromorphic hardware as a self-organizing computing system
Lyes Khacef, Bernard Girau, Nicolas Rougier, Andres Upegui, Benoit Miramond
This paper presents the self-organized neuromorphic architecture named SOMA. The objective is to study neural-based self-organization in computing systems and to prove the feasibility of a self-organizing hardware structure. Considering that these properties emerge from large scale and fully connected neural maps, we will focus on the definition of a self-organizing hardware architecture based on digital spiking neurons that offer hardware efficiency. From a biological point of view, this corresponds to a combination of the so-called synaptic and structural plasticities. We intend to define computational models able to simultaneously self-organize at both computation and communication levels, and we want these models to be hardware-compliant, fault tolerant and scalable by means of a neuro-cellular structure.
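The self-organization that SOMA builds on goes back to Kohonen-style neural maps: a winner-take-most update with a neighbourhood function makes nearby units respond to nearby inputs. A minimal rate-coded sketch (the paper uses digital spiking neurons and hardware-level plasticity; `train_som` and its schedule are illustrative choices):

```python
import math
import random

def train_som(data, n_units=10, epochs=200, lr=0.5, radius=2.0, seed=0):
    """Train a 1D Kohonen map on scalar inputs; return the unit weights."""
    rng = random.Random(seed)
    weights = [rng.random() for _ in range(n_units)]
    samples = list(data)
    for epoch in range(epochs):
        decay = 1.0 - epoch / epochs  # anneal both rate and radius
        rng.shuffle(samples)
        for x in samples:
            # Best-matching unit: the neuron whose weight is closest to x.
            bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
            for i in range(n_units):
                # Units near the BMU on the map are also pulled toward the
                # input; this neighbourhood coupling is what organizes the
                # map so that adjacent units code for similar inputs.
                h = math.exp(-((i - bmu) ** 2)
                             / (2 * (radius * decay + 1e-9) ** 2))
                weights[i] += lr * decay * h * (x - weights[i])
    return weights

# Uniform data on [0, 1]: the trained units spread out to tile the input.
units = train_som([i / 99 for i in range(100)])
print([round(v, 2) for v in units])
```

SOMA's contribution is making this kind of plasticity hardware-compliant: spiking neurons in a neuro-cellular structure that self-organizes its communication as well as its computation.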
ViewpointS: towards a Collective Brain
Philippe Lemoisson, Stefano A. Cerri
Tracing knowledge acquisition and linking learning events to interaction between peers is a major challenge of our times. We have conceived, designed and evaluated a new paradigm for constructing and using collective knowledge by Web interactions, which we call ViewpointS. By exploiting the similarity with Edelman's Theory of Neuronal Group Selection (TNGS), we conjecture that it may be metaphorically considered a Collective Brain, especially effective in the case of trans-disciplinary representations. Far from being free of doubts, in the paper we present the reasons for (and the limits of) our proposal, which aims to become a useful integrating tool for future quantitative explorations of individual as well as collective learning at different degrees of granularity. We thereby challenge each of the current approaches: the logical one in the semantic Web, the statistical one in mining and deep learning, and the social one in recommender systems based on authority and trust; not in each approach's preferred field of operation, but in their weaknesses at integration, which fall far short of the holistic and dynamic behavior of the human brain.
Harvard awarded £19m to build brain-inspired artificial intelligence (Wired UK)
Harvard University has been awarded $28 million (£19m) to investigate why brains are so much better at learning and retaining information than artificial intelligence. The award, from the Intelligence Advanced Research Projects Activity (IARPA), could help make AI systems faster, smarter and more like human brains. While many computers have a comparable storage capacity, their ability to recognise patterns and learn information does not match the human brain's. But a better understanding of how neurons are connected could help develop more complex artificial intelligence. Most neuroscientists estimate that the 'storage capacity' of the human brain ranges between 10 and 100 terabytes, with some evaluations putting the number at close to 2.5 petabytes.